Robustifying AdaBoost by Adding the Naive Error Rate

Authors

  • Takashi Takenouchi
  • Shinto Eguchi
Abstract

AdaBoost can be derived by sequential minimization of the exponential loss function. It implements the learning process by exponentially reweighting examples according to classification results. However, weights are often too sharply tuned, so that AdaBoost suffers from nonrobustness and overlearning. We propose a new boosting method that is a slight modification of AdaBoost. The loss function is defined by a mixture of the exponential loss and naive error loss functions. As a result, the proposed method incorporates the effect of forgetfulness into AdaBoost. The statistical significance of our method is discussed, and simulations are presented for confirmation.
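The abstract's point can be illustrated with a small numerical sketch. The snippet below is not the paper's implementation; the function names and the mixing parameter `eta` are illustrative assumptions. It only shows why a mixture of the exponential loss and a naive (0-1 style) error term damps the influence of badly misclassified examples, which is the source of AdaBoost's nonrobustness.

```python
import numpy as np

def exp_loss(margin):
    # AdaBoost's exponential loss: exp(-y * F(x)),
    # where margin = y * F(x).
    return np.exp(-margin)

def naive_error(margin):
    # Naive (0-1 style) error: 1 if misclassified, else 0.
    return (margin < 0).astype(float)

def mixture_loss(margin, eta=0.5):
    # Hypothetical mixture of the two losses; eta is an
    # illustrative mixing weight, not the paper's notation.
    return (1.0 - eta) * exp_loss(margin) + eta * naive_error(margin)

# Example weights in boosting are proportional to the loss at each
# example's current margin. Negative margin = misclassified.
margins = np.array([3.0, 0.5, -0.5, -4.0])
w_exp = exp_loss(margins)
w_mix = mixture_loss(margins)

# Under the pure exponential loss, the outlier with margin -4
# takes nearly all of the total weight; under the mixture its
# relative share shrinks, because the naive error term is bounded.
print(w_exp[-1] / w_exp.sum())
print(w_mix[-1] / w_mix.sum())
```

The bounded naive-error component caps how much any single grossly misclassified point can dominate the reweighting, which is the "forgetfulness" effect the abstract describes.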


Related Articles

An Evaluation of Machine Learning-Based Methods for Detection of Phishing Sites

In this paper, we evaluate the performance of machine learning-based methods for detection of phishing sites. In our previous work [1], we attempted to employ a machine learning technique to improve the detection accuracy. Our preliminary evaluation showed that the AdaBoost-based detection method can achieve higher detection accuracy than the traditional detection method. Here, we evaluate the perfor...


Experiments on Spam Detection with Boosting, Svm and Naive Bayes

For this project, I implement 3 popular text classification algorithms for spam detection, namely AdaBoost, Support Vector Machines, and Naive Bayes. The performance is evaluated on some testing datasets. All experiments are done in Matlab. The experimental result is that all 3 algorithms have a satisfactory performance on spam detection. In terms of accuracy, AdaBoost has the best error bound. On th...


A Study of AdaBoost with Naive Bayesian Classifiers: Weakness and Improvement

This article investigates boosting naive Bayesian classification. It first shows that boosting does not improve the accuracy of the naive Bayesian classifier as much as we expected in a set of natural domains. By analyzing the reason for boosting’s weakness, we propose to introduce tree structures into naive Bayesian classification to improve the performance of boosting when working with naive ...


Automated Detection of Driver Fatigue Based on AdaBoost Classifier with EEG Signals

Purpose: Driver fatigue has become one of the important causes of road accidents, and many studies have analyzed it. EEG is becoming increasingly useful in measuring the fatigue state. Manual interpretation of EEG signals is impossible, so an effective method for automatic detection of EEG signals is crucially needed. Method: In order to evaluate the complex, unstable, and non-...


Empirical Comparison of Boosting

Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world datasets. We review these algorithms and describe a large empirical study comparing several variants in conjunction with a decision tree inducer (three variants) and a Naive-Bayes inducer. The purpose of the...



Journal:
  • Neural Computation

Volume 16, Issue 4

Pages: -

Publication date: 2004